Trainability in Recurrent Neural Networks
Authors
Abstract
Two potential bottlenecks on the expressiveness of recurrent neural networks (RNNs) are their ability to store information about the task in their parameters, and to store information about the input history in their units. We show experimentally that all common RNN architectures achieve nearly the same per-task and per-unit capacity bounds with careful training, for a variety of tasks and stacking depths. They can store an amount of task information which is linear in the number of parameters, and is approximately 5 bits per parameter. They can additionally store approximately one real number from their input history per hidden unit. We further find that for several tasks it is the per-task parameter capacity bound that determines performance. These results suggest that many previous results comparing RNN architectures are driven primarily by differences in training effectiveness, rather than differences in capacity. Supporting this observation, we compare training difficulty for several architectures, and show that vanilla RNNs are far more difficult to train, yet have slightly higher capacity. Finally, we propose two novel RNN architectures, one of which is easier to train than the LSTM or GRU for deeply stacked architectures.
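To make the two bounds concrete, here is a minimal Python sketch (mine, not code from the paper) that applies the abstract's empirical constants, roughly 5 bits of task information per parameter and roughly one real number of input history per hidden unit, to standard parameter counts for vanilla RNN, GRU, and LSTM cells. The cell sizes and the exclusion of the readout layer are assumptions made for illustration.

```python
# Rough capacity estimates implied by the abstract's empirical constants.
# Assumptions: standard cell parameter counts (recurrent weights, input weights,
# biases; no output projection), ~5 bits/parameter, ~1 real number/hidden unit.

def cell_param_count(cell: str, n_in: int, n_hidden: int) -> int:
    """Parameters in a single recurrent cell, ignoring the readout layer."""
    gates = {"rnn": 1, "gru": 3, "lstm": 4}[cell]
    return gates * (n_hidden * n_hidden + n_hidden * n_in + n_hidden)

def capacity_estimates(cell: str, n_in: int, n_hidden: int,
                       bits_per_param: float = 5.0) -> dict:
    """Capacity numbers implied by the per-parameter and per-unit bounds."""
    n_params = cell_param_count(cell, n_in, n_hidden)
    return {
        "parameters": n_params,
        "task_capacity_bits": bits_per_param * n_params,  # ~5 bits per parameter
        "history_capacity_reals": n_hidden,               # ~1 real number per unit
    }

if __name__ == "__main__":
    for cell in ("rnn", "gru", "lstm"):
        print(cell, capacity_estimates(cell, n_in=64, n_hidden=256))
```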
Similar resources
Path-Normalized Optimization of Recurrent Neural Networks with ReLU Activations
We investigate the parameter-space geometry of recurrent neural networks (RNNs) and develop an adaptation of the path-SGD optimization method, attuned to this geometry, that can learn plain RNNs with ReLU activations. On several datasets that require capturing long-term dependency structure, we show that path-SGD can significantly improve the trainability of ReLU RNNs compared to RNNs trained with SGD...
MinimalRNN: Toward More Interpretable and Trainable Recurrent Neural Networks
We introduce MinimalRNN, a new recurrent neural network architecture that achieves performance comparable to the popular gated RNNs with a simplified structure. It employs minimal updates within the RNN, which not only leads to efficient learning and testing but, more importantly, to better interpretability and trainability. We demonstrate that by endorsing the more restrictive update rule, MinimalRNN l...
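For illustration, a minimal numpy sketch of the kind of single-gate update this abstract alludes to: the input is first embedded into a latent z_t, and the hidden state is then moved toward z_t by one update gate. The exact MinimalRNN parameterization may differ in detail; treat the weight names (Wx, Uh, Uz, b_u) and activation choices here as assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def minimal_rnn_step(x_t, h_prev, Wx, Uh, Uz, b_u):
    """One step of a MinimalRNN-style update: embed the input, then blend it
    into the hidden state through a single update gate."""
    z_t = np.tanh(Wx @ x_t)                      # latent input representation
    u_t = sigmoid(Uh @ h_prev + Uz @ z_t + b_u)  # single update gate
    return u_t * h_prev + (1.0 - u_t) * z_t      # convex combination of old state and z_t

# Toy usage with random weights.
n_in, n_h = 8, 16
rng = np.random.default_rng(0)
Wx = rng.normal(scale=0.1, size=(n_h, n_in))
Uh = rng.normal(scale=0.1, size=(n_h, n_h))
Uz = rng.normal(scale=0.1, size=(n_h, n_h))
b_u = np.zeros(n_h)
h = np.zeros(n_h)
for _ in range(5):
    h = minimal_rnn_step(rng.normal(size=n_in), h, Wx, Uh, Uz, b_u)
```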
Robust stability of stochastic fuzzy impulsive recurrent neural networks with time-varying delays
In this paper, the global robust stability of stochastic impulsive recurrent neural networks with time-varying delays, represented by Takagi-Sugeno (T-S) fuzzy models, is considered. A novel Linear Matrix Inequality (LMI)-based stability criterion is obtained by using Lyapunov functional theory to guarantee the asymptotic stability of uncertain fuzzy stochastic impulsive recurrent neural...
Solving Linear Semi-Infinite Programming Problems Using Recurrent Neural Networks
The linear semi-infinite programming problem is an important class of optimization problems that deals with infinitely many constraints. In this paper, to solve this problem, we combine a discretization method and a neural network method. By a simple discretization of the infinite constraints, we convert the linear semi-infinite programming problem into a linear programming problem. Then, we use...
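As a rough illustration of the discretization step described above (my own toy example, not the authors' formulation), the sketch below replaces an infinite constraint family indexed by t ∈ [0, 1] with constraints at a finite grid and solves the resulting ordinary LP with scipy.optimize.linprog; presumably the recurrent-network component of the paper then plays the role of the off-the-shelf LP solver used here.

```python
# Toy linear semi-infinite program:
#   minimize  x0 + x1   subject to   x0 + t*x1 >= exp(t)  for all t in [0, 1].
# Discretizing t on a finite grid turns this into an ordinary LP.
import numpy as np
from scipy.optimize import linprog

def solve_discretized_lsip(n_grid: int = 200):
    t = np.linspace(0.0, 1.0, n_grid)             # finite grid over the index set [0, 1]
    # x0 + t*x1 >= exp(t)  <=>  -x0 - t*x1 <= -exp(t)   (linprog uses A_ub @ x <= b_ub)
    A_ub = np.column_stack([-np.ones_like(t), -t])
    b_ub = -np.exp(t)
    c = np.array([1.0, 1.0])                      # objective: minimize x0 + x1
    return linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * 2)

res = solve_discretized_lsip()
print(res.x, res.fun)  # optimal value approaches e ≈ 2.718 (the chord of exp on [0, 1])
```

Refining the grid tightens the approximation to the original semi-infinite problem.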
Efficient Short-Term Electricity Load Forecasting Using Recurrent Neural Networks
Short-term load forecasting (STLF) plays an important role in the economic and reliable operation of power systems. Electric load demand has a complex profile with many multivariable and nonlinear dependencies. In this study, a recurrent neural network (RNN) architecture is presented for STLF. The proposed model is capable of forecasting the next 24-hour load profile. The main feature of this network is ...
Application of artificial neural networks on drought prediction in Yazd (Central Iran)
In recent decades, artificial neural networks (ANNs) have shown great ability in modeling and forecasting non-linear and non-stationary time series, and in most cases, especially in the prediction of such phenomena, have shown very good performance. This paper presents the application of artificial neural networks to predicting drought at the Yazd meteorological station. In this research, different archite...